# Multi-source Medical Data

## MedAlpaca 13b
License: CC
MedAlpaca 13b is a large language model fine-tuned for the medical domain. Built on the LLaMA architecture with 13 billion parameters, it is designed to improve performance on medical question answering and dialogue tasks.
Tags: Large Language Model, Transformers, English
Author: medalpaca
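
The listing describes MedAlpaca 13b as a LLaMA-based model for medical Q&A and dialogue. The sketch below shows one way to query such a model with the Hugging Face transformers text-generation pipeline; the repository ID `medalpaca/medalpaca-13b`, the prompt layout, and the sampling settings are illustrative assumptions, not details stated on this page.

```python
# Minimal sketch: medical Q&A with MedAlpaca 13b via transformers.
# "medalpaca/medalpaca-13b" is the assumed Hugging Face repo ID; the
# prompt format and generation settings are illustrative only.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="medalpaca/medalpaca-13b",
    device_map="auto",  # spread the 13B weights across available devices
)

prompt = (
    "Context: Answer as a medical assistant.\n"
    "Question: What are common symptoms of iron-deficiency anemia?\n"
    "Answer:"
)

# Generate a short answer; sampling parameters are illustrative defaults.
result = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])
```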
## MedBERT
License: MIT
MedBERT is a Transformer-based pretrained language model designed for biomedical named entity recognition. It is initialized from Bio_ClinicalBERT and further pretrained on multiple biomedical datasets.
Tags: Sequence Labeling, Transformers, English
Author: Charangan
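
Since MedBERT is described as a pretrained encoder intended for biomedical NER, a typical use is to attach a token-classification head and fine-tune it on an annotated corpus. The sketch below assumes `Charangan/MedBERT` as the Hugging Face repository ID and uses a hypothetical disease/drug tag set; the classification head is randomly initialized until fine-tuned.

```python
# Minimal sketch: MedBERT as the base encoder for biomedical NER.
# "Charangan/MedBERT" is the assumed Hugging Face repo ID; the label
# set below is a hypothetical example, not one shipped with the model.
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-Disease", "I-Disease", "B-Drug", "I-Drug"]  # hypothetical tags

tokenizer = AutoTokenizer.from_pretrained("Charangan/MedBERT")
model = AutoModelForTokenClassification.from_pretrained(
    "Charangan/MedBERT",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={l: i for i, l in enumerate(labels)},
)

# Tag a clinical sentence; in practice the head would first be trained
# on an annotated biomedical NER dataset before predictions are useful.
text = "The patient was started on metformin for type 2 diabetes."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
predicted_ids = outputs.logits.argmax(dim=-1)[0].tolist()
print([model.config.id2label[i] for i in predicted_ids])
```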